Mutual information challenges entropy bounds
Author
Abstract
We consider some formulations of the entropy bounds at the semiclassical level. The entropy S(V) localized in a region V is divergent in quantum field theory (QFT). Instead, we focus on the mutual information I(V,W) = S(V) + S(W) − S(V ∪ W) between two disjoint sets V and W. This is a low-energy quantity, independent of the regularization scheme. In addition, the mutual information is bounded above by twice the entropy of either of the sets involved. Calculations of I(V,W) in QFT show that the entropy in empty space cannot be renormalized to zero, and must in fact be very large. We find that this entropy due to the vacuum fluctuations violates the FMW bound in Minkowski space. The mutual information also gives a precise, cutoff-independent meaning to the statement that the number of degrees of freedom increases with the volume in QFT. If the holographic bound holds, this points to the essential nonlocality of the physical cutoff. Violations of the Bousso bound would require conformal theories and large distances. We speculate that the presence of a small cosmological constant might prevent such a violation.
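The definition I(V,W) = S(V) + S(W) − S(V ∪ W) and the bound I ≤ 2 min(S(V), S(W)) can be made concrete in a finite-dimensional toy model. The sketch below is illustrative only (the paper works in continuum QFT): it computes the mutual information between two qubits V and W inside a three-qubit GHZ-like pure state; the state, the `partial_trace` helper, and all names are our own, not from the paper.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho ln rho), in nats."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]          # drop zero eigenvalues (0 ln 0 = 0)
    return float(-np.sum(vals * np.log(vals)))

def partial_trace(rho, keep, dims):
    """Trace out every subsystem not listed in `keep`."""
    n = len(dims)
    rho = rho.reshape(dims + dims)     # row axes 0..n-1, column axes n..2n-1
    drop = [i for i in range(n) if i not in keep]
    for count, i in enumerate(sorted(drop)):
        axis = i - count               # axes shift left after each trace
        rho = np.trace(rho, axis1=axis, axis2=axis + (n - count))
    d = int(np.prod([dims[i] for i in keep]))
    return rho.reshape(d, d)

# Toy pure state: (|000> + |111>)/sqrt(2); V = qubit 0, W = qubit 1.
psi = np.zeros(8)
psi[0] = psi[7] = 1 / np.sqrt(2)
rho = np.outer(psi, psi)
dims = [2, 2, 2]

S_V = entropy(partial_trace(rho, [0], dims))
S_W = entropy(partial_trace(rho, [1], dims))
S_VW = entropy(partial_trace(rho, [0, 1], dims))
I_VW = S_V + S_W - S_VW                # mutual information (ln 2 for this state)

print(I_VW <= 2 * min(S_V, S_W) + 1e-9)  # True: the upper bound holds
```

Because the mutual information is a difference of entropies, the UV-divergent boundary terms that plague S(V) in QFT cancel; in this finite-dimensional analogue there is of course nothing to regularize, but the same combination is what remains cutoff-independent in the field-theory case.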
Similar works
Compressed Secret Key Agreement: Maximizing Multivariate Mutual Information per Bit
The multiterminal secret key agreement problem by public discussion is formulated with an additional source compression step where, prior to the public discussion phase, users independently compress their private sources to filter out strongly correlated components in order to generate a common secret key. The objective is to maximize the achievable key rate as a function of the joint entropy o...
Convexity/concavity of Rényi entropy and α-mutual information
Entropy is well known to be Schur concave on finite alphabets. Recently, the authors have strengthened the result by showing that for any pair of probability distributions P and Q with Q majorized by P, the entropy of Q is larger than the entropy of P by the amount of relative entropy D(P||Q). This result applies to P and Q defined on countable alphabets. This paper shows the counterpart of t...
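The majorization statement can be checked numerically. The sketch below is an illustrative check with a made-up pair P, Q (not taken from the paper): when Q is majorized by P, the entropy gap H(Q) − H(P) is at least the relative entropy D(P||Q).

```python
import numpy as np

def H(p):
    """Shannon entropy in nats."""
    p = np.asarray(p, float)
    return float(-np.sum(p * np.log(p)))

def D(p, q):
    """Relative entropy D(p||q) in nats (assumes full support of q)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def majorizes(p, q):
    """True if q is majorized by p: sorted partial sums of p dominate those of q."""
    p, q = np.sort(p)[::-1], np.sort(q)[::-1]
    return bool(np.all(np.cumsum(q) <= np.cumsum(p) + 1e-12))

P = [0.7, 0.2, 0.1]
Q = [0.5, 0.3, 0.2]   # hypothetical example; Q is majorized by P

print(majorizes(P, Q))              # True
print(H(Q) - H(P) >= D(P, Q))       # True: entropy gap is at least D(P||Q)
```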
Relations Between Conditional Shannon Entropy and Expectation of $\ell_{\alpha}$-Norm
The paper examines relationships between the conditional Shannon entropy and the expectation of the ℓα-norm for joint probability distributions. More precisely, we investigate the tight bounds of the expectation of the ℓα-norm with a fixed conditional Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the conditional Shannon entropy and several informati...
Extremal Relations Between Shannon Entropy and $\ell_{\alpha}$-Norm
The paper examines relationships between the Shannon entropy and the ℓα-norm for n-ary probability vectors, n ≥ 2. More precisely, we investigate the tight bounds of the ℓα-norm with a fixed Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the Shannon entropy and several information measures which are determined by the ℓα-norm. Moreover, we app...
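The connection between the Shannon entropy and the ℓα-norm runs through the standard identity H_α(p) = (α/(1−α)) ln ‖p‖_α for the Rényi entropy, which recovers the Shannon entropy as α → 1. A minimal numerical illustration (the example distribution is our own, not from the paper):

```python
import numpy as np

def lp_norm(p, a):
    """l_a-norm of a probability vector, ||p||_a = (sum p_i^a)^(1/a)."""
    return float(np.sum(np.asarray(p, float) ** a) ** (1.0 / a))

def renyi(p, a):
    """Renyi entropy of order a via the l_a-norm: H_a = a/(1-a) * ln ||p||_a."""
    return float(a / (1.0 - a) * np.log(lp_norm(p, a)))

def shannon(p):
    p = np.asarray(p, float)
    return float(-np.sum(p * np.log(p)))

p = [0.5, 0.25, 0.25]  # example distribution

# Renyi entropy approaches the Shannon entropy as the order tends to 1.
print(abs(renyi(p, 1.0001) - shannon(p)) < 1e-3)   # True
# l_a-norms of a probability vector are non-increasing in the order a.
print(lp_norm(p, 2) <= lp_norm(p, 1.5))            # True
```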
Sharp Bounds Between Two Rényi Entropies of Distinct Positive Orders
Many axiomatic definitions of entropy of a random variable, such as the Rényi entropy, are closely related to the ℓα-norm of its probability distribution. This study considers probability distributions on finite sets, and examines the sharp bounds of the ℓβ-norm with a fixed ℓα-norm, α ≠ β, for n-dimensional probability vectors with an integer n ≥ 2. From the results, we derive the sharp bound...
Publication date: 2006